Emergent Predication Structure in Vector Representations of Neural Readers

Authors

  • Hai Wang
  • Takeshi Onishi
  • Kevin Gimpel
  • David McAllester
Abstract

Reading comprehension is a question answering task in which the answer is to be found in a given passage about entities and events not mentioned in general knowledge sources. A significant number of neural architectures for this task (neural readers) have recently been developed and evaluated on large cloze-style datasets. We present experiments supporting the emergence of “predication structure” in the hidden state vectors of a class of neural readers including the Attentive Reader and the Stanford Reader. We posit that the hidden state vectors can be viewed as (a representation of) a concatenation [P, c] of a “predicate vector” P and a “constant symbol vector” c, and that the hidden state represents the atomic formula P(c). This predication structure plays a conceptual role in relating “aggregation readers”, such as the Attentive Reader and the Stanford Reader, to “explicit reference readers”, such as the Attention-Sum Reader, the Gated-Attention Reader, and the Attention-over-Attention Reader. In an independent contribution, we show that adding linguistic features to the input of existing neural readers significantly boosts performance, yielding the best results to date on the Who-did-What dataset.
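The core claim above can be sketched numerically. The following minimal NumPy illustration (an assumption for exposition, not the paper's code; all names are hypothetical) shows the view of a hidden state as a concatenation [P, c], and how, under that view, scoring a candidate entity reduces to an inner product against the constant-symbol half of the state:

```python
import numpy as np

# Illustrative dimensionality of each half of the hidden state.
d = 4
rng = np.random.default_rng(0)

P = rng.normal(size=d)       # "predicate vector", e.g. the question's semantic property
c = rng.normal(size=d)       # "constant symbol vector" for an entity
h = np.concatenate([P, c])   # hidden state viewed as the concatenation [P, c],
                             # representing the atomic formula P(c)

# The decomposition round-trips: the two halves are recoverable from h.
P_part, c_part = h[:d], h[d:]
assert np.allclose(P_part, P)
assert np.allclose(c_part, c)

# Under this view, matching a candidate entity embedding e against the
# hidden state amounts to an inner product with the constant half.
e = rng.normal(size=d)
score = float(c_part @ e)
print(score)
```

This is only a toy decomposition; in the paper's setting the structure is argued to *emerge* in trained readers rather than being imposed by construction.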


Similar Papers

Emergent Predication Structure in Hidden State Vectors of Neural Readers

A significant number of neural architectures for reading comprehension have recently been developed and evaluated on large cloze-style datasets. We present experiments supporting the emergence of “predication structure” in the hidden state vectors of these readers. More specifically, we provide evidence that the hidden state vectors represent atomic formulas Φ[c] where Φ is a semantic property ...


A Joint Semantic Vector Representation Model for Text Clustering and Classification

Text clustering and classification are two main tasks of text mining. Feature selection plays a key role in the quality of the clustering and classification results. Although word-based features such as term frequency-inverse document frequency (TF-IDF) vectors have been widely used in different applications, their shortcoming in capturing semantic concepts of text motivated researchers to use...


Embedding Probabilities in Predication Space with Hermitian Holographic Reduced Representations

Predication-based Semantic Indexing (PSI) is an approach to generating high-dimensional vector representations of concept-relation-concept triplets. In this paper, we develop a variant of PSI that accommodates estimation of the probability of encountering a particular predication (such as fluoxetine TREATS major depressive disorder) in a collection of predications concerning a concept of intere...


Emergent latent symbol systems in recurrent neural networks

Fodor and Pylyshyn (1988) famously argued that neural networks cannot behave systematically short of implementing a combinatorial symbol system. A recent response from Frank et al. (2009) claimed to have trained a neural network to behave systematically without implementing a symbol system and without any in-built predisposition towards combinatorial representations. We believe systems like the...


Visualizing polysemy using LSA and the predication algorithm

Context is a determining factor in language, and plays a decisive role in polysemic words. Several psycholinguistically-motivated algorithms have been proposed to emulate human management of context, under the assumption that the value of a word is evanescent and takes on meaning only in interaction with other structures. The predication algorithm (Kintsch, 2001), for example, uses a vector rep...



Publication date: 2016